Import an API and all its operations using its documentation in Azure API Management
Often you need to import an entire API collection, with all of its supported operations, so that you can mask all of them or set policies on them. This is easy to do with the Azure API Management service. To start, log in to the Azure Portal, open your API Management resource, and go to the APIs section on the left. From the options, select OpenAPI. Here I will use the Petstore API, https://petstore.swagger.io/. Go to that site, or to wherever your API publishes the collection of all its supported operations in JSON format; you can work either from a hosted JSON collection or from a downloaded JSON file. In the OpenAPI specification field, paste the link to your JSON collection or upload the JSON file, fill in the rest of the details, and click the Create button. You will see the list of imported operations appear. Hope you enjoyed this blog!
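To call one of the imported operations through the gateway from code, you pass your subscription key in the Ocp-Apim-Subscription-Key header (API Management's default, unless you disable the subscription requirement). Here is a minimal C# sketch; the gateway host, API suffix, and key are placeholders for your own values.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // API Management looks for the subscription key in this header by default.
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-subscription-key>");

        // /pet/findByStatus is one of the operations imported from the Petstore definition.
        var response = await client.GetAsync(
            "https://<your-apim-name>.azure-api.net/petstore/pet/findByStatus?status=available");

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}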
Mask your APIs using Azure API Management
Often you don't want to expose an API's original URL in your code, for security reasons. You can hide it by fronting the original URL with a different one, so end users can't find out what the original API is. Let's see how this can be achieved.

First, log in to the Azure Portal. Search for the API Management service and create a new resource. Select the desired Azure region and resource group, and give the resource a unique name, as the new API URL will be based on it. For testing purposes, set the pricing tier to Developer. Leave the rest of the options at their defaults and create the resource. Make sure to grab a coffee, as the deployment takes around 45 minutes to an hour. When the deployment completes, you will receive an email alerting you that the process is complete. Click 'Go to resource' from the deployment screen; you can find the new base URL inside the resource.

Now let's add a sample API so that we can mask it. In this case I will use https://catfact.ninja/fact, an API that gives us random facts about cats. Go to the APIs section of your API Management resource and select the HTTP option, so that we can define this API manually, then fill in the details and save it; you can see it in the left pane. Next, we have to add an operation for this API: select the GET operation and point the request at the backend path, in this case /fact. We can test this API through the Test tab, and as you can see, it works.

If we now call the new gateway URL that masks the original API, it does not work. This happens because each API needs to be bound to a product. Go to the Products section and choose Unlimited ('Starter' has a rate limit of 5 calls per 60 seconds; 'Unlimited' does not have this limitation). Click 'Add API' and select your API; you will get a notification saying the API has been added to the product. To call this API without a subscription key, head over to the Settings section in the left pane, disable the 'Subscription required' option, and save. Now hit the gateway URL through a browser again, and as you can see, it works. I hope this blog helped you 🙂
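From the client's point of view, the payoff is that code only ever references the gateway URL. A minimal C# sketch, assuming the API Management resource is named contoso-apim and the API suffix is catfacts (both placeholders):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Only the masked gateway URL appears in code;
        // the real backend, https://catfact.ninja/fact, stays hidden.
        var response = await client.GetAsync("https://contoso-apim.azure-api.net/catfacts/fact");
        response.EnsureSuccessStatusCode();

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}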
Create and deploy your first Azure Function using Visual Studio
In my previous blog, we created and deployed an Azure Function using the Azure Portal itself. In this part, we will see how to create and deploy an Azure Function using Visual Studio.

To proceed, you need the ASP.NET and Azure development workloads installed on top of Visual Studio. If you don't have them installed, go to Apps & features on your device and modify the Visual Studio installation by selecting Change instead of Uninstall.

After installing these workloads, create a new project and search for the Azure Functions template. Give the project any name you like and set the function's trigger to HTTP. A sample piece of code is loaded; in this scenario, I will replace the default code with custom logic that adds two numbers. To run this logic, press F5 and copy the URL that appears; hitting this URL will trigger our function. Paste and run the URL in a web browser, and you can see that the function is triggered and our two numbers are added in the output.

Now that the function is working, we have to publish it. Right-click the project name and click Publish. Select Azure as the target and Azure Function App as the specific target, select the desired Function App instance or create a new one, click the final Publish button, and wait for the function to deploy successfully.

Finally, we will check the function in the Azure Portal and trigger it from there. Log in to your Azure account, and you will find the recently deployed function in the Functions section of the left pane. Open this function, get its URL, and paste and run it in a web browser to see the output. Hope you understood the process of deploying serverless Azure Functions using Visual Studio. Have a great day!
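The blog doesn't reproduce the custom logic itself, so here is a minimal sketch of what an add-two-numbers HTTP-triggered function might look like in the in-process model; the function name and the a and b query parameters are my own choices, not from the original post.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class AddNumbers
{
    [FunctionName("AddNumbers")]
    public static Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        // Read the two operands from the query string, e.g. ?a=2&b=3.
        if (!int.TryParse(req.Query["a"], out var a) ||
            !int.TryParse(req.Query["b"], out var b))
        {
            return Task.FromResult<IActionResult>(
                new BadRequestObjectResult("Pass numeric query parameters 'a' and 'b'."));
        }

        log.LogInformation("Adding {A} and {B}", a, b);
        return Task.FromResult<IActionResult>(new OkObjectResult($"Sum: {a + b}"));
    }
}

Hitting .../api/AddNumbers?a=2&b=3 in the browser would then return Sum: 5.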
Create your first Azure Function using the Azure Portal
Learning serverless Azure Functions and deploying them might sound like a daunting task, but Azure Functions are one of the most essential features of Microsoft Azure. In this article, we will see how to create and deploy our first function using the Azure Portal itself; in the next blog, I will demonstrate how to publish functions using Visual Studio.

To start, log in to your Azure Portal. Search for 'Function App' and click Create. Select the desired resource group, give your Function App a unique name, and fill in the rest of the options as required. After the deployment completes, open the resource.

Go to the Functions section in the left-hand pane and click the Create button. Select the 'Develop in portal' option, as we are using the Azure Portal to create our function, and choose the 'HTTP trigger' template, since we want to trigger the function through its URL. Go to the 'Code + Test' option and you'll find sample code; you can change the values in it for a personalized response, or change the default logic entirely. Here I am passing my name in the Body of the request (input) section, and as you can see, I receive a personalized response based on the request body. You can use the 'Get function URL' option and display the desired output on hitting the URL; in the default code, the output is produced by the last line, return new OkObjectResult(yourOutput);. Hope you learned to deploy a function using the portal from this blog. Have a great day!
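For reference, the default code the portal generates for an HTTP trigger looks roughly like the following C# script (a sketch from memory, not copied from the portal); the personalization comes from reading name out of the query string or the JSON request body.

#r "Newtonsoft.Json"

using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Look for 'name' in the query string first, then in the JSON body.
    string name = req.Query["name"];
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    string responseMessage = string.IsNullOrEmpty(name)
        ? "This HTTP triggered function executed successfully. Pass a name for a personalized response."
        : $"Hello, {name}. This HTTP triggered function executed successfully.";

    // This last line produces the output you see in the browser.
    return new OkObjectResult(responseMessage);
}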
Using the Power BI Report Builder to create and publish paginated reports.
Power BI Report Builder is a great tool for creating paginated reports that print cleanly in a proper page layout. If you have worked with SSRS Report Builder, the whole environment will look familiar. Power BI Report Builder is also a very light tool and has additional features, such as directly importing a data source from an existing Power BI report in any workspace, publishing these RDL reports to an existing Power BI workspace, and embedding existing paginated reports into Power BI dashboards. In this blog we will see how to create a report using Power BI Report Builder.

First you need to download Power BI Report Builder: go to app.powerbi.com, click the ellipsis beside your profile, click the Download option, and select Paginated Report Builder. Install the downloaded setup file and sign in to the Report Builder. You can open the tool from the Start menu and start creating reports right away by adding data from any of the supported data sources.

For this blog I am using the dataset of an existing Power BI report. Navigate to the dataset of the desired report in the Power BI service, click the ellipsis, and select Create paginated report. Wait for the report to be processed, then open the downloaded RDL file. Since we imported our data source directly from an existing Power BI report, we don't have to add a data source again; however, we do have to configure the dataset table. Right-click Datasets and select Add Dataset, then choose the data source from which the dataset should get its data. Click Query Designer and wait for it to load. In the left pane you can see all the fields from the data source; drag the fields that you'll be using in the report and execute the query. After previewing the dataset, click OK, and you can now see the dataset and its fields in the left pane.

You can insert various visuals from the Insert tab on the ribbon and populate them with fields from the dataset. After finalizing the design and features of the report, preview it by clicking the Run button in the top-left corner of the window. You'll now see your paginated report; to exit this view, click the Design button in the top-left corner. You can also publish this report to your Power BI service workspaces; however, publishing requires a Power BI Premium license. Thank you for reading, hope this blog helped!
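When you execute the query, the Query Designer generates a DAX query against the Power BI dataset for you. The result is roughly of this shape (the table, column, and measure names below are invented for illustration):

// A SUMMARIZECOLUMNS query of the kind the Query Designer produces;
// replace the names with fields from your own dataset.
EVALUATE
SUMMARIZECOLUMNS(
    'Customer'[Customer Name],
    'Date'[Year],
    "Total Sales", SUM('Sales'[Sales Amount])
)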
Send a message/notification on Microsoft Teams as soon as an Opportunity is won in Dynamics 365 via Azure Logic Apps.
In this blog we will see the steps to send an automated message via Teams as soon as an Opportunity is closed as won in Microsoft Dynamics 365.

Step 1: Go to portal.azure.com and create an Azure Logic App resource.
Step 2: Enter all the details required while creating a Logic App, such as the name, resource group, subscription, region, etc.
Step 3: Select the Dynamics 365 trigger 'When a record is updated'.
Step 4: After setting up the Dynamics 365 CRM connection, select the Opportunities entity.
Step 5: Set the data refresh (polling) interval as required.
Step 6: Add a Condition action in the next step; the condition is status_label = Won for the true branch.
Step 7: Inside the True block, select the 'Post a message in a chat or channel' action. You can also handle the False block, but in this case we can leave it blank.
Step 8: You can post this message to a group or a channel, or send it as a personal chat.
Step 9: Save the Logic App and wait for the trigger-successful notification.
Step 10: Go to Dynamics 365 CRM and navigate to the Opportunities entity.
Step 11: Open a test opportunity, or create one if none exists, and close the opportunity as won.
Step 12: As soon as you close this opportunity, you should receive the configured message in Teams.

So in this blog we saw how we can send messages on MS Teams using Azure Logic Apps when certain conditions are met. Hope this helped!
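Behind the designer, the condition from Step 6 is stored in the Logic App's JSON workflow definition. A rough, illustrative sketch of its shape (the exact property name for the status label depends on the Dynamics 365 connector's trigger outputs, so treat the field name as an assumption):

"If_opportunity_is_won": {
  "type": "If",
  "expression": {
    "and": [
      { "equals": [ "@triggerBody()?['status_label']", "Won" ] }
    ]
  },
  "actions": {
    "Post_a_message_in_a_chat_or_channel": { ... }
  }
}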
Export Power BI data to CSV via Power Automate visual and upload to SharePoint
After Power Automate was added to Power BI as a preview feature, it is finally available for general use. In this blog we are going to use the Power Automate visual to export Power BI data to CSV and upload the CSV to SharePoint.

Open Power BI Desktop and drag in the Power Automate visual; if it isn't available in the Visualizations pane, update your Power BI version. Now add the columns you need in your CSV file to the visual. Click the ellipsis button on the visual and select Edit. Create a new flow and select Instant cloud flow. A default Power BI trigger is created; click to add a new step. Search for the 'Compose' action in the search bar and select it, as we have to compose the Power BI raw data first; in its Inputs, select Power BI data. After composing, we have to convert this data to CSV, so add a 'Create CSV table' step and pass it the previous output. Then we have to upload this CSV to SharePoint, so add a SharePoint 'Create file' step: enter the SharePoint site address and folder path, give the file any name you like, and add '.csv' as the suffix. Save and close the flow.

To run the flow from the report, hold Ctrl and left-click the visual's button. As you can see, the file is created in SharePoint. Thanks for reading, hope this blog helped!
How to list all dates between two dates in Power BI and distribute numbers evenly among them.
Consider a scenario where we are given a start date, an end date, and the total duration for a particular task, and we have to distribute the total duration equally among all dates between the start and end date. We can solve this with a combination of Power Query and DAX. Let's see the steps.

1. Generate the list of dates from the start date to the end date.

Open the Power Query editor. The table I'm working on has two columns holding the start and end date of each task. Power Query cannot generate a list between two dates directly, so first we have to convert the data type of these columns from date to numeric; this is easily done by right-clicking the desired column and changing the data type right away. After the data types of the respective columns are changed to numeric, click 'Add Column' in the ribbon and select Custom Column. The syntax for generating a list in Power Query is "starting number .. ending number", so we apply this syntax according to our needs; the Number function makes sure only numeric values are taken, to avoid any conflicts. After validating the code, press the OK button, and you will see a new column containing lists. Click the expand button at the top right of the column, and after expanding you'll see a list of numbers. Since these numbers are numeric, we have to convert them back to the date format, again by right-clicking the column and changing its data type. As you can see, we now have all the dates between the start and end dates.

2. Distribute the duration equally among the generated dates.

Create a new calculated column in Power BI Desktop and write the following DAX:

actual hours = Sheet1[original estimate] / CALCULATE(COUNT(Sheet1[taskid]), FILTER(Sheet1, Sheet1[taskid] = EARLIER(Sheet1[taskid])))

This divides the duration assigned to a task by the number of rows that share the same task id. As you can see, the Original Estimate column, which holds the total duration for a task, is divided equally into a new column called 'actual hours'. You won't see dates for Saturdays and Sundays, as I filtered those out in the Advanced Editor itself since they are non-working days; this can be modified according to your requirements. Thank you for reading, hope this article helped!
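For reference, the custom column formula from step 1 can be written in Power Query M like this (the column names Start Date and End Date are placeholders for your own):

// Produces the list of day numbers from the start date to the end date;
// Number.From makes sure the operands are numeric.
= { Number.From([Start Date]) .. Number.From([End Date]) }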
How to integrate entities from one CRM system to another using TIBCO Cloud Integration
Sometimes we need to copy data from one Microsoft Dynamics 365 CRM environment to another, for scenarios such as moving data from a UAT environment to production or copying data from a parent organization to a child. This is easily achieved by leveraging TIBCO Cloud Integration's simple, no-code approach. Here is how we can integrate our data.

Go to app.scribesoft.com and, from the dashboard, create a new integration app. Inside the app dashboard, click the hamburger-style button and then click the Create Integration Map option. On the page that opens, we can name our map as desired; since I am going to integrate the Account entity in this tutorial, I have named it 'account integration'.

Now we have to add a source connection from which data will be extracted; here we can use an existing connector or create a new one. After clicking the OK button, the metadata for the source connector loads; in the meanwhile we can add the destination connector as well, and then wait until the metadata for both source and destination has loaded.

Once the connectors are loaded, drag the Query block from the source connector onto the map designer, as we want to query the required records first. Double-click the Query block, and from the entity dropdown select the entity you want to copy from source to destination; for this tutorial I am selecting the Account entity. From the Filter tab you can filter records of the entity based on its fields. After adding filters, click the Preview tab to check that the correct data is being pulled from the source with the filters applied, and verify the values of the entity's fields. Validate and click the OK button.

Since there may be multiple records, we have to create a record in the destination for each record found in the source; to do this, use the For Each loop from the Controls section. Because a destination record must be created for every source record, insert the Create block from the destination connector inside the For Each loop. Double-click the Create block and select Account as the destination connector's entity. In the Fields tab, map all the fields necessary to create an Account in CRM and click OK; if an important field isn't mapped correctly, it will throw an error during validation.

Our map is now ready without any errors, so click the Apply button, save the map, and proceed to debugging. Debugging is important, as it helps us understand the flow of data, and any errors can be tracked easily. In Debug mode, click Start, then click Next to step through the execution of the map, and continue until the map finishes running. If you check the destination CRM now, you will find the new record reflected from the source. That was it: in a few steps we copied data from one CRM system to another. Thank you for reading my blog, hope it helped!
Delete multiple tables or columns at once in Power BI
While importing data into Power BI we can choose which tables to load into the data model, but we can't choose specific columns. Columns can be removed through the Advanced Editor or by deleting them manually one by one, but this can also be done without the Advanced Editor, deleting multiple columns or tables at once. To do this, go to the data model section of your Power BI report. Expand the desired table, hold the Ctrl key on your keyboard, and select the columns you want to delete; you can also select multiple tables the same way. After selecting multiple columns or tables, right-click and choose the 'Delete from model' option. In the dialog box that appears, click the 'Delete' button, then click the 'Apply changes' option when it appears, and you will see that all those columns are now deleted. Hope this helps!
While importing data into PowerBI we can choose the tables we want to load in the data model but we cant choose specific columns, although columns can be removed through advanced editor or by manually deleting them one by one this can also be done without using advance editor and multiple columns or tables can be deleted at once. To do this go into the data model section of your PowerBI report. Now expand the desired table and the press ctrl key on your keyboard and select the columns you want to delete. You can also select multiple tables using the same instruction. After selecting multiples columns or tables right click and select “Delete from the model” option. In the dialogue box that appears click on the “Delete” button. Then click on “Apply Changes” option which will appear in the window and you can see all those columns are now deleted Hope this helps!